- linearly constrained optimization
- оптимизация при линейных ограничениях (optimization under linear constraints)
English-Russian scientific dictionary. 2008.
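Illustrative sketch (not part of the dictionary entry): a minimal example of a linearly constrained optimization problem, minimizing a quadratic objective subject to linear inequality constraints with SciPy. The objective, constraint matrix, and bounds are assumed numbers chosen only to make the term concrete.

```python
# Sketch of linearly constrained optimization: a smooth nonlinear objective
# minimized subject to linear constraints A @ x >= lb and x >= 0.
# All specific numbers are illustrative assumptions.
import numpy as np
from scipy.optimize import LinearConstraint, minimize

def objective(x):
    # f(x) = (x0 - 1)^2 + (x1 - 2.5)^2
    return (x[0] - 1.0) ** 2 + (x[1] - 2.5) ** 2

A = np.array([[ 1.0, -2.0],
              [-1.0, -2.0],
              [-1.0,  2.0]])
lb = np.array([-2.0, -6.0, -2.0])
lin_con = LinearConstraint(A, lb, np.inf)   # lb <= A @ x <= +inf

x0 = np.array([2.0, 0.0])
res = minimize(objective, x0, method="trust-constr",
               constraints=[lin_con], bounds=[(0, None), (0, None)])
print(res.x)  # approximately [1.4, 1.7]
```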
Multidisciplinary design optimization — Multidisciplinary design optimization (MDO) is a field of engineering that uses optimization methods to solve design problems incorporating a number of disciplines. As defined by Prof. Carlo Poloni, MDO is the art of finding the best compromise … Wikipedia
Nelder–Mead method — Nelder–Mead simplex search over the Rosenbrock banana function (above) and Himmelblau's function (below). See simplex algorithm for Dantzig's algorithm for the problem of linear optimization … Wikipedia
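Illustrative sketch: since the snippet mentions the Rosenbrock banana function, here is a short run of SciPy's Nelder–Mead implementation on it. The starting point and tolerances are assumptions for illustration.

```python
# Derivative-free Nelder–Mead simplex search on the Rosenbrock function,
# using SciPy's built-in implementation and test function.
from scipy.optimize import minimize, rosen

x0 = [-1.2, 1.0]  # arbitrary starting point (assumption)
res = minimize(rosen, x0, method="Nelder-Mead",
               options={"xatol": 1e-8, "fatol": 1e-8})
print(res.x)  # converges near the global minimum at [1.0, 1.0]
```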
Lagrange multipliers on Banach spaces — In the field of calculus of variations in mathematics, the method of Lagrange multipliers on Banach spaces can be used to solve certain infinite-dimensional constrained optimization problems. The method is a generalization of the classical method … Wikipedia
ABS methods — ABS methods, where the acronym contains the initials of Jozsef Abaffy, Charles G. Broyden and Emilio Spedicato, have been developed since 1981 to generate a large class of algorithms for the following applications: * solution of general linear… … Wikipedia
Linear programming — (LP, or linear optimization) is a mathematical method for determining a way to achieve the best outcome (such as maximum profit or lowest cost) in a given mathematical model for some list of requirements represented as linear relationships.… … Wikipedia
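Illustrative sketch: the LP formulation described above, solved with scipy.optimize.linprog. The cost vector and constraints are made-up numbers; linprog minimizes, so the objective is negated to maximize.

```python
# Small linear program: maximize 3x + 2y subject to linear inequalities.
from scipy.optimize import linprog

c = [-3.0, -2.0]                 # maximize 3x + 2y  ->  minimize -3x - 2y
A_ub = [[1.0, 1.0],              # x + y  <= 4
        [1.0, 3.0]]              # x + 3y <= 6
b_ub = [4.0, 6.0]
res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
print(res.x, -res.fun)           # optimum at x=4, y=0 with objective value 12
```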
List of numerical analysis topics — This is a list of numerical analysis topics, by Wikipedia page, with sections on general concepts, error, elementary and special functions, and numerical linear algebra … Wikipedia
Newton's method — In numerical analysis, Newton's method (also known as the Newton–Raphson method), named after Isaac Newton and Joseph Raphson, is a method for finding successively better approximations to the roots (or zeroes) of a real-valued function. The… … Wikipedia
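Illustrative sketch of the root-finding iteration just described, in plain Python; the target function and starting guess are assumptions for illustration.

```python
# Newton–Raphson iteration x_{n+1} = x_n - f(x_n) / f'(x_n),
# applied to f(x) = x**2 - 2, whose positive root is sqrt(2).
def newton(f, f_prime, x, tol=1e-12, max_iter=50):
    for _ in range(max_iter):
        step = f(x) / f_prime(x)
        x -= step
        if abs(step) < tol:
            break
    return x

root = newton(lambda x: x * x - 2.0, lambda x: 2.0 * x, x=1.0)
print(root)  # ~1.4142135623731
```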
Gauss–Newton algorithm — The Gauss–Newton algorithm is a method used to solve non-linear least squares problems. It can be seen as a modification of Newton's method for finding a minimum of a function. Unlike Newton's method, the Gauss–Newton algorithm can only be used… … Wikipedia
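Illustrative sketch of the Gauss–Newton update pattern for a non-linear least-squares fit; the model (an exponential decay), the synthetic data, and the starting parameters are all assumptions.

```python
# Gauss–Newton for non-linear least squares: repeatedly solve the linearized
# normal equations (J^T J) dp = -J^T r and update the parameters.
import numpy as np

t = np.linspace(0.0, 4.0, 20)
y = 2.0 * np.exp(-1.3 * t)                 # synthetic, noise-free data (assumption)

def residuals(p):                          # r(p) = model(p) - data
    a, k = p
    return a * np.exp(-k * t) - y

def jacobian(p):                           # partial derivatives of r w.r.t. (a, k)
    a, k = p
    e = np.exp(-k * t)
    return np.column_stack([e, -a * t * e])

p = np.array([1.0, 1.0])                   # starting guess (assumption)
for _ in range(20):
    r, J = residuals(p), jacobian(p)
    dp = np.linalg.solve(J.T @ J, -J.T @ r)
    p += dp
    if np.linalg.norm(dp) < 1e-10:
        break
print(p)  # should recover approximately [2.0, 1.3]
```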
Harmony search — (HS) is a metaheuristic algorithm (also known as a soft computing or evolutionary algorithm) mimicking the improvisation process of musicians. In the process, each musician plays a note in search of the best harmony together. Likewise,… … Wikipedia
2-satisfiability — In computer science, 2-satisfiability (abbreviated as 2-SAT or just 2SAT) is the problem of determining whether a collection of two-valued (Boolean or binary) variables with constraints on pairs of variables can be assigned values satisfying all… … Wikipedia
Eigenvalues and eigenvectors — For more specific information regarding the eigenvalues and eigenvectors of matrices, see Eigendecomposition of a matrix. In this shear mapping the red arrow changes direction but the blue arrow does not. Therefore the blue arrow is an… … Wikipedia
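Illustrative sketch tying the shear-mapping description above to a computation with NumPy: the horizontal (blue-arrow) direction is an eigenvector of a shear matrix, the specific matrix being an assumption for illustration.

```python
# Eigenvalues/eigenvectors of a 2-D shear matrix. The horizontal direction
# [1, 0] keeps its direction under the shear, i.e. it is an eigenvector
# (with eigenvalue 1); a generic direction is tilted.
import numpy as np

shear = np.array([[1.0, 1.0],
                  [0.0, 1.0]])
eigenvalues, eigenvectors = np.linalg.eig(shear)
print(eigenvalues)         # [1. 1.] – a repeated eigenvalue
print(eigenvectors[:, 0])  # [1. 0.] – the unchanged (blue-arrow) direction
```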